
    Improvements on the k-center problem for uncertain data

    In real applications, there are situations where we need to model problems based on uncertain data. This leads us to define an uncertain model for some classical geometric optimization problems and to propose algorithms to solve them. In this paper, we study the $k$-center problem for uncertain input. In our setting, each uncertain point $P_i$ is located, independently of the other points, in one of several possible locations $\{P_{i,1},\dots,P_{i,z_i}\}$ in a metric space with metric $d$, with specified probabilities, and the goal is to compute $k$ centers $\{c_1,\dots,c_k\}$ that minimize the expected cost $$Ecost(c_1,\dots,c_k)=\sum_{R\in\Omega} prob(R)\max_{i=1,\dots,n}\min_{j=1,\dots,k} d(\hat{P}_i,c_j),$$ where $\Omega$ is the probability space of all realizations $R=\{\hat{P}_1,\dots,\hat{P}_n\}$ of the given uncertain points and $prob(R)=\prod_{i=1}^n prob(\hat{P}_i)$. In the restricted assigned version of this problem, an assignment $A:\{P_1,\dots,P_n\}\rightarrow\{c_1,\dots,c_k\}$ is given for any choice of centers, and the goal is to minimize $$Ecost_A(c_1,\dots,c_k)=\sum_{R\in\Omega} prob(R)\max_{i=1,\dots,n} d(\hat{P}_i,A(P_i)).$$ In the unrestricted version, the assignment is not specified, and the goal is to compute $k$ centers $\{c_1,\dots,c_k\}$ and an assignment $A$ that minimize the above expected cost. We give several improved constant-factor approximation algorithms for the assigned versions of this problem in Euclidean space and in a general metric space. Our results significantly improve those of \cite{guh} and generalize those of \cite{wang} to any dimension. Our approach is to substitute a certain center point for each uncertain point and to study the properties of these certain points. The proposed algorithms are efficient and simple to implement.
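    The expected cost above can be evaluated directly on a small instance by enumerating every realization. The following is a minimal brute-force sketch for illustration only (the paper's approximation algorithms exist precisely to avoid this exponential enumeration); all names are hypothetical:

```python
import itertools

def expected_cost(points, probs, centers, dist):
    """Brute-force Ecost(c_1, ..., c_k) over all realizations.

    points:  list of location-lists, points[i] = [P_{i,1}, ..., P_{i,z_i}]
    probs:   matching probabilities, probs[i][j] = prob(P_i = P_{i,j})
    centers: the k chosen centers
    dist:    the metric d(p, q)
    """
    total = 0.0
    # Enumerate every realization R = (P̂_1, ..., P̂_n).
    for choice in itertools.product(*(range(len(p)) for p in points)):
        # Independence: prob(R) = prod_i prob(P̂_i).
        prob_r = 1.0
        for i, j in enumerate(choice):
            prob_r *= probs[i][j]
        # cost(R) = max_i min_j d(P̂_i, c_j).
        cost = max(
            min(dist(points[i][j], c) for c in centers)
            for i, j in enumerate(choice)
        )
        total += prob_r * cost
    return total
```

    For example, with one uncertain point at 0 or 2 (each with probability 1/2), one certain point at 1, and the single center $c_1=0$ on the real line, the realizations cost 1 and 2, giving an expected cost of 1.5.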

    Sequentially Cohen-Macaulay matroidal ideals

    Let $R=K[x_1,\dots,x_n]$ be the polynomial ring in $n$ variables over a field $K$, and let $J$ be a matroidal ideal of degree $d$ in $R$. In this paper, we study the class of sequentially Cohen-Macaulay matroidal ideals. In particular, all sequentially Cohen-Macaulay matroidal ideals of degree $2$ are classified. Furthermore, we give a classification of sequentially Cohen-Macaulay matroidal ideals of degree $d\geq 3$ in some special cases.
    Comment: 12 pages, comments are welcome
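    As a concrete instance of the objects studied (this example is not from the paper): a matroidal ideal is generated by the squarefree monomials attached to the bases of a matroid, so the smallest nontrivial degree-2 case comes from the uniform matroid $U_{2,3}$:

```latex
% Bases of U_{2,3}: {1,2}, {1,3}, {2,3}, giving the matroidal ideal
\[
  J \;=\; (x_1x_2,\; x_1x_3,\; x_2x_3) \;\subset\; K[x_1,x_2,x_3].
\]
% The basis-exchange property of the matroid is exactly the exchange
% property that characterizes matroidal (squarefree polymatroidal) ideals.
```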

    A Deep Learning Anomaly Detection Method in Textual Data

    In this article, we propose using deep learning and transformer architectures combined with classical machine learning algorithms to detect and identify anomalies in textual data. The deep learning model provides crucial contextual information about the textual data, with all textual content converted to a numerical representation. We use several machine learning methods, such as Sentence Transformers, autoencoders, logistic regression, and distance-calculation methods, to predict anomalies. The methods are tested on text data into which we injected synthetic data from different sources, either as anomalies or as targets. Different methods and algorithms from the field of outlier detection are explained, and the results of the best technique are presented. These results suggest that our algorithm could reduce false-positive rates compared with the other anomaly detection methods we tested.
    Comment: 8 pages, 4 figures
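    The distance-based scoring step mentioned in the abstract can be sketched as follows. This is a hypothetical stand-in, not the paper's pipeline: the Sentence-Transformer encoding is replaced by precomputed vectors, and the score is simply the Euclidean distance to the centroid of all embeddings:

```python
import math

def anomaly_scores(embeddings):
    """Distance-based anomaly scores over sentence embeddings.

    embeddings: list of equal-length vectors (stand-ins for the
    embeddings a sentence encoder would produce). Returns each
    vector's Euclidean distance to the centroid; larger = more anomalous.
    """
    n, dim = len(embeddings), len(embeddings[0])
    centroid = [sum(v[d] for v in embeddings) / n for d in range(dim)]
    return [math.dist(v, centroid) for v in embeddings]

def flag_outliers(embeddings, z_thresh=1.5):
    """Flag indices whose score exceeds mean + z_thresh * std.

    The threshold is modest because with few points a single extreme
    outlier inflates both the mean and the standard deviation.
    """
    scores = anomaly_scores(embeddings)
    mean = sum(scores) / len(scores)
    std = math.sqrt(sum((s - mean) ** 2 for s in scores) / len(scores))
    return [i for i, s in enumerate(scores) if std > 0 and s > mean + z_thresh * std]
```

    In a real pipeline the input vectors would come from an encoder (e.g. a Sentence-Transformers model's `encode` output), and the distance rule could be swapped for an autoencoder's reconstruction error, as the abstract describes.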